We concluded that Linear Search has a worst-case time complexity of $O(n)$, meaning its running time grows proportionally to the input size $n$. To understand this performance guarantee precisely, we must analyze the exact number of basic operations performed in the worst possible scenario.
- The worst-case running time, denoted $f(n)$, is the maximum number of basic operations an algorithm requires for any input of size $n$.
- Worst Case for Linear Search: The target $t$ is either the last element of array $A$, or it is not present at all. In both cases, the algorithm performs the maximum number of comparisons, requiring $n$ iterations.
- For a standard implementation of Linear Search on an array of size $n$, the running time can be modeled as a specific linear function based on its elementary operations.
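One such standard implementation is sketched below (a hedged example; the function name and the choice of a `while` loop are ours, chosen so each line maps onto an operation counted in the table that follows). Note that the model counts the loop condition check, the array indexing, and the comparison, but not the index increment:

```python
def linear_search(A, t):
    """Return the index of the first occurrence of t in A, or -1 if absent."""
    i = 0                    # initialization (counted as an initial operation)
    n = len(A)
    while i < n:             # loop condition check: n + 1 times in the worst case
        if A[i] == t:        # indexing A[i] and comparison: n times each in the worst case
            return i
        i += 1               # index increment (not counted in this cost model)
    return -1                # final return when t is not present
```

In the worst case, `t` is absent (or sits at index `n - 1`), so the loop runs through all `n` elements before the final return.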
**Worst-Case Operation Count**
| Operation | Cost per Instance | Total Instances (Worst Case) | Total Operations |
|---|---|---|---|
| Loop condition check ($i < n$) | 1 | $n + 1$ | $n + 1$ |
| Array indexing ($A[i]$) | 1 | $n$ | $n$ |
| Target comparison ($A[i] == t$) | 1 | $n$ | $n$ |
| Initialization ($i = 0$) and final return | 1 | 2 | 2 |
Summing these costs, the exact worst-case running time function is:
$$f(n) = (n+1) + n + n + 2 = 3n + 3$$

Because $f(n)$ is a linear function of $n$, we confirm that the worst-case time complexity is indeed $O(n)$, and this is the algorithm's performance guarantee.
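The count $3n + 3$ can be checked empirically by instrumenting the search to tally exactly the operations listed in the table (a sketch for illustration; the function name and the returned `(index, ops)` pair are our conventions, not part of the original analysis):

```python
def linear_search_counted(A, t):
    """Linear search that tallies the basic operations from the cost table."""
    n = len(A)
    ops = 1                  # initialization of i
    i = 0
    while True:
        ops += 1             # loop condition check (i < n)
        if not (i < n):
            break
        ops += 1             # array indexing A[i]
        ops += 1             # target comparison A[i] == t
        if A[i] == t:
            ops += 1         # early return on success
            return i, ops
        i += 1               # increment: excluded from the cost model
    ops += 1                 # final return when t is absent
    return -1, ops
```

Running this on an array of size $n$ with an absent target yields exactly $1 + (n+1) + n + n + 1 = 3n + 3$ operations, matching the derivation above.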